System information

  • master

    os: Mac OS X 10.10
    ip: 192.168.2.108
    hostname: master
  • slave1

    os: Mac OS X 10.10
    ip: 192.168.2.104
    hostname: s1
    

Editing /etc/hosts

On both the master and slave1 hosts, add the following entries to /etc/hosts:

# add config
192.168.2.108   master
192.168.2.104   s1
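The two entries can be appended and checked from a shell. This is a minimal sketch that writes to a temporary file so it is safe to run anywhere; on the real machines you would append to /etc/hosts itself (e.g. with `sudo tee -a /etc/hosts`):

```shell
# Sketch: append the cluster entries to a hosts file and verify them.
# A temp file stands in for /etc/hosts here so the sketch is harmless to run.
HOSTS_FILE="$(mktemp)"
cat >> "$HOSTS_FILE" <<'EOF'
192.168.2.108   master
192.168.2.104   s1
EOF
# Both entries should now be present (prints 2).
grep -c -E 'master|s1' "$HOSTS_FILE"
```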

Passwordless SSH login setup

http://www.jianshu.com/p/1fdc...

Hadoop installation

  • Prepare the deployment root directory

mkdir -p ~/work/hadoop
mkdir -p ~/work/hadoop/hadoop
mkdir -p ~/work/hadoop/hbase
  • Prepare the Hadoop installation directory

    <1>. Download hadoop-2.7.3.tar.gz
    <2>. Extract it
    <3>. Copy it into the Hadoop deployment directory and rename it
    
cp ~/Documents/tool/hadoop/hadoop-2.7.3.tar.gz ~/work/hadoop/
tar xzvf hadoop-2.7.3.tar.gz 
mv hadoop-2.7.3 hadoop
  • Prepare the Hadoop configuration directory

mkdir -p ~/work/hadoop/hadoop-config

~/.bash_profile setup

# config java env
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home
# config hadoop
export HADOOP_CONF_DIR=/Users/jingchen/work/hadoop/hadoop-config
export HBASE_CONF_DIR=/Users/jingchen/work/hadoop/hbase-config
export HADOOP_HOME=/Users/jingchen/work/hadoop/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME

## config native lib of hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="${HADOOP_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native/"
export HADOOP_ROOT_LOGGER=DEBUG,console
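After editing the profile, it helps to sanity-check that the directories the variables point at actually exist before going further. A minimal sketch, with $HOME standing in for /Users/jingchen and the layout assumed from this guide:

```shell
# Sketch: verify the profile's Hadoop paths (layout assumed from this guide).
export HADOOP_HOME="$HOME/work/hadoop/hadoop"
export HADOOP_CONF_DIR="$HOME/work/hadoop/hadoop-config"
for d in "$HADOOP_HOME" "$HADOOP_CONF_DIR"; do
  if [ -d "$d" ]; then echo "ok: $d"; else echo "missing: $d"; fi
done
```

On the real machine, `source ~/.bash_profile` followed by `hadoop version` confirms the setup end to end.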

Hadoop configuration files

  • Prepare the configuration files
    Copy the configuration files from hadoop/etc/hadoop into the hadoop-config directory, then edit the following files:

core-site.xml
hadoop-env.sh
hdfs-site.xml
mapred-site.xml.template
slaves
yarn-env.cmd
yarn-env.sh
yarn-site.xml
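The copy and the template rename can be done in one go. A sketch assuming the paths used throughout this guide (in Hadoop 2.7.3 the stock configuration files live under etc/hadoop):

```shell
# Sketch: populate hadoop-config from the shipped defaults.
cp ~/work/hadoop/hadoop/etc/hadoop/* ~/work/hadoop/hadoop-config/
# mapred-site.xml only ships as a template; copy it so it takes effect.
cp ~/work/hadoop/hadoop-config/mapred-site.xml.template \
   ~/work/hadoop/hadoop-config/mapred-site.xml
```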

slaves

Add the slave nodes, one per line:

s1

core-site.xml

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master:9000</value>
  </property>

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/Users/jingchen/work/hadoop/hadoop/tmp</value>
  </property>
</configuration>

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>master:9001</value>
  </property>

  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/Users/jingchen/work/hadoop/hadoop/name</value>
  </property>

  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/Users/jingchen/work/hadoop/hadoop/data/</value>
  </property>

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

mapred-site.xml

<configuration>
  <!--
  <property>
    <name>mapred.job.tracker</name>
    <value>hdfs://master:9001/</value>
  </property>
  -->

  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>

  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>master:10020</value>
  </property>

  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>master:19888</value>
  </property>
</configuration>

yarn-env.sh

# add java env conf
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home

# add native lib conf
YARN_OPTS="$YARN_OPTS -Djava.library.path=${HADOOP_HOME}/lib/native/"

yarn-site.xml

<configuration>

  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>

  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>

  <property>
    <name>yarn.resourcemanager.address</name>
    <value>master:8032</value>
  </property>

  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>master:8030</value>
  </property>

  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>master:8031</value>
  </property>

  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>master:8033</value>
  </property>

  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>master:8088</value>
  </property>

</configuration>

hadoop-env.sh

# add config item
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.7.0_40.jdk/Contents/Home"
export HADOOP_PID_DIR="/Users/jingchen/work/hadoop/hadoop/tmp"
export HADOOP_SECURE_DN_PID_DIR=${HADOOP_PID_DIR}

# silence the OS X Kerberos "Unable to load realm info" warning
#export HADOOP_OPTS="$HADOOP_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_COMMON_LIB_NATIVE_DIR="${HADOOP_HOME}/lib/native"
export HADOOP_OPTS="${HADOOP_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native/"

Deploying the Hadoop files to the slaves

scp -r ~/work/hadoop/hadoop  jingchen@s1:~/work/hadoop/hadoop

scp -r ~/work/hadoop/hadoop-config  jingchen@s1:~/work/hadoop/hadoop-config
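With the files deployed, the cluster would typically be formatted and started from the master. A sketch of the standard Hadoop 2.x commands (the format step is run once only, on the master):

```shell
# Format HDFS once, on the master only (this wipes any existing HDFS metadata).
hdfs namenode -format
# Start the HDFS and YARN daemons across master and slaves.
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
# jps on each node should now list the expected daemons
# (NameNode/ResourceManager on master, DataNode/NodeManager on s1).
jps
```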

Important on Mac: building the Hadoop native libraries for OS X

Why the native libs need to be rebuilt:

Hadoop's native libraries live in its lib/native directory, in this guide ~/work/hadoop/hadoop/lib/native.
However, starting Hadoop after installing it on a Mac produces this warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

It is worth compiling the library yourself: most of the prebuilt native libraries and recipes found online do not work, and building locally is also what the official documentation recommends:

The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory. You can download the hadoop distribution from Hadoop Common Releases.
The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.
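The build itself follows the steps in Hadoop's BUILDING.txt. A rough sketch, assuming a JDK, Maven, cmake and protobuf 2.5.0 are already installed (e.g. via Homebrew), run from the unpacked source tree:

```shell
# Sketch: build Hadoop with native libraries from source (per BUILDING.txt).
cd hadoop-2.7.3-src
mvn package -Pdist,native -DskipTests -Dtar
# The resulting libs land in hadoop-dist/target/hadoop-2.7.3/lib/native/
```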

Using the new native lib

Copy the compiled native libraries into the corresponding directory of the downloaded Hadoop binary distribution:

## 1. back up the old native libs
mv ~/work/hadoop/hadoop/lib/native ~/work/hadoop/hadoop/lib/native_bak
## 2. copy the newly built native libs into place
mkdir -p ~/work/hadoop/hadoop/lib/native
cp hadoop-2.7.3-src/hadoop-dist/target/hadoop-2.7.3/lib/native/* ~/work/hadoop/hadoop/lib/native/
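Afterwards, Hadoop itself can report whether the native libraries load, assuming $HADOOP_HOME/bin is on the PATH:

```shell
# Sketch: verify that Hadoop now picks up the rebuilt native libraries.
hadoop checknative -a
# Expect "hadoop: true" with a path under lib/native if the build worked.
```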
